On accelerating the regularized alternating least-squares algorithm for tensors

Authors
Abstract


Similar articles

Optimal Rates for Regularized Least-squares Algorithm

We develop a theoretical analysis of the generalization performance of the regularized least-squares algorithm on a reproducing kernel Hilbert space in the supervised learning setting. The presented results hold in the general framework of vector-valued functions and can therefore be applied to multi-task problems. In particular, we observe that the concept of effective dimension plays a central ...
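
For context, a minimal sketch of the regularized least-squares estimator on an RKHS is given below: solve (K + nλI)c = y for the expansion coefficients and predict with the kernel expansion. The Gaussian kernel, the fixed λ, and the scalar-valued toy data are illustrative assumptions, not choices made in the paper.

```python
# Minimal kernel regularized least-squares (ridge) sketch.
# The Gaussian kernel and the fixed lambda are illustrative assumptions.
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) Gram matrix.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rls_fit(X, y, lam=1e-2, sigma=1.0):
    # Solve (K + n*lam*I) c = y for the coefficient vector c.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def rls_predict(X_train, c, X_new, sigma=1.0):
    # f(x) = sum_i c_i k(x_i, x)
    return gaussian_kernel(X_new, X_train, sigma) @ c

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
c = rls_fit(X, y)
print(rls_predict(X, c, np.array([[0.5]])))   # should be close to sin(0.5)
```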


Fast Rates for Regularized Least-squares Algorithm

We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the concept of effective dimension of an integral operator plays a central role in the definition of a criterion for the choice of the regularization parameter as a function of the number of samples. In fact, a minimax analysis is...
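
As a rough illustration of tying the regularization parameter to the sample size, the sketch below fits a plain finite-dimensional ridge estimator with a decreasing schedule λ_n = c·n^(−α); the exponent, the schedule, and the linear model are placeholders for exposition, not the minimax-optimal choice derived from the effective dimension in the paper.

```python
# Sketch: choosing the regularization parameter as a function of the sample size n.
# The exponent in the schedule below is an illustrative placeholder, not the
# paper's minimax-optimal choice (which depends on the effective dimension).
import numpy as np

def lambda_schedule(n, c=1.0, alpha=0.5):
    # A decreasing schedule lambda_n = c * n**(-alpha).
    return c * n ** (-alpha)

def ridge_fit(X, y, lam):
    # Plain linear ridge regression as a stand-in for the RKHS estimator.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * X.shape[0] * np.eye(d), X.T @ y)

rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0, 0.5])
for n in (50, 500, 5000):
    X = rng.standard_normal((n, 3))
    y = X @ w_true + 0.3 * rng.standard_normal(n)
    w = ridge_fit(X, y, lambda_schedule(n))
    print(n, np.linalg.norm(w - w_true))   # error shrinks as n grows
```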


A Regularized Total Least Squares Algorithm

Error-contaminated systems Ax ≈ b, for which A is ill-conditioned, are considered. Such systems may be solved using Tikhonov-like regularized total least squares (R-TLS) methods. Golub et al., 1999, presented a direct algorithm for the solution of the Lagrange multiplier formulation of the R-TLS problem. Here we present a parameter-independent algorithm for the approximate R-TLS solution. The a...
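
For comparison, the sketch below contrasts a naive solve of an ill-conditioned system with a plain Tikhonov-regularized solve; it is not the parameter-independent R-TLS algorithm of the paper, and the test matrix, noise level, and parameter mu are invented for illustration.

```python
# Basic Tikhonov-regularized solve of an ill-conditioned system Ax ~ b, shown
# only for context; this is NOT the parameter-independent R-TLS algorithm of
# the paper. Matrix, noise level, and mu are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -10, n)                      # rapidly decaying singular values
A = U @ np.diag(s) @ V.T                        # ill-conditioned matrix, cond(A) ~ 1e10
x_true = V @ (s * rng.standard_normal(n))       # solution dominated by the large singular directions
b = A @ x_true + 1e-6 * rng.standard_normal(n)  # error-contaminated right-hand side

mu = 1e-6                                       # Tikhonov parameter (illustrative choice)
x_naive = np.linalg.solve(A, b)
x_tik = np.linalg.solve(A.T @ A + mu * np.eye(n), A.T @ b)

print("unregularized error:", np.linalg.norm(x_naive - x_true))
print("Tikhonov error     :", np.linalg.norm(x_tik - x_true))
```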


Some Convergence Results on the Regularized Alternating Least-Squares Method for Tensor Decomposition

We study the convergence of the Regularized Alternating Least-Squares (RALS) algorithm for tensor decompositions. As a main result, we show that, given the existence of critical points of the Alternating Least-Squares method, the limit points of the convergent subsequences of RALS are critical points of the least-squares cost functional. Some numerical examples indicate a faster convergen...
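
The sketch below runs a generic proximal-type regularized ALS sweep for a rank-R CP decomposition of a third-order tensor, in which each least-squares subproblem is damped toward the previous factor by a λ term; it is a plain RALS sketch under an assumed λ and rank, not the accelerated variant or the specific scheme analyzed in the paper above.

```python
# Minimal regularized ALS (RALS) sweep for a rank-R CP decomposition of a
# third-order tensor. The proximal-type term lam*(previous factor) is a
# generic RALS sketch; it is not the accelerated scheme of the paper.
import numpy as np

def khatri_rao(B, C):
    # Column-wise Khatri-Rao product: row (j*K + k) holds B[j, :] * C[k, :].
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

def unfold(T, mode):
    # Mode-m unfolding consistent with the Khatri-Rao ordering above.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def rals(T, R, lam=0.1, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((T.shape[m], R)) for m in range(3)]
    for _ in range(iters):
        for m in range(3):
            B, C = [factors[k] for k in range(3) if k != m]
            KR = khatri_rao(B, C)
            G = (B.T @ B) * (C.T @ C)           # Gram matrix of the Khatri-Rao product
            rhs = unfold(T, m) @ KR + lam * factors[m]
            factors[m] = rhs @ np.linalg.inv(G + lam * np.eye(R))
    return factors

# Toy example: recover a random rank-3 tensor.
rng = np.random.default_rng(3)
A, B, C = (rng.standard_normal((10, 3)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)
F = rals(T, R=3)
T_hat = np.einsum('ir,jr,kr->ijk', *F)
print("relative error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```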


Regularized Least-Squares Classification

We consider the solution of binary classification problems via Tikhonov regularization in a Reproducing Kernel Hilbert Space using the square loss, and denote the resulting algorithm Regularized Least-Squares Classification (RLSC). We sketch the historical developments that led to this algorithm, and demonstrate empirically that its performance is equivalent to that of the well-known Support Ve...
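
A minimal sketch of the RLSC recipe follows: encode labels as ±1, solve the regularized least-squares problem in an RKHS, and classify by the sign of the resulting function. The RBF kernel, λ, and the toy data are illustrative assumptions.

```python
# Minimal RLSC sketch: Tikhonov-regularized least squares in an RKHS with
# labels encoded as +/-1 and predictions taken as the sign of the estimate.
# Kernel, lambda, and the toy data are illustrative assumptions.
import numpy as np

def rbf(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def rlsc_fit(X, y, lam=1e-2, sigma=1.0):
    # Solve (K + n*lam*I) c = y with y in {-1, +1}.
    n = len(y)
    return np.linalg.solve(rbf(X, X, sigma) + n * lam * np.eye(n), y)

def rlsc_predict(X_train, c, X_new, sigma=1.0):
    return np.sign(rbf(X_new, X_train, sigma) @ c)

rng = np.random.default_rng(4)
X = rng.standard_normal((200, 2))
y = np.sign(X[:, 0] + X[:, 1])          # linearly separable toy labels in {-1, +1}
c = rlsc_fit(X, y)
print("training accuracy:", (rlsc_predict(X, c, X) == y).mean())
```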



Journal

Journal title: ETNA - Electronic Transactions on Numerical Analysis

Year: 2018

ISSN: 1068-9613

DOI: 10.1553/etna_vol48s1